Nonlinear dimensionality reduction through an improved Isomap algorithm for the classification of large datasets
Author
Abstract
An ideal band-pass filter extracts the frequency components of a time series that lie within a particular range of frequencies and removes the rest. In practice, it is difficult to construct an "ideal" band-pass filter, as it would require an infinite number of data points. Therefore, an approximation to an ideal filter is used to extract the components of a time series in a particular frequency range, such as business cycles of known duration. We present several examples of filters in economics and finance: the exponentially weighted moving average (EWMA) estimate of volatility in a foreign exchange market, the Hodrick-Prescott filter in macroeconomics, and other examples of filters used in the technical analysis of prices in financial markets.

3 Market Risk Prediction under Long Memory: When VaR Is Higher than Expected (Harald Kinateder, Universität Passau, [email protected]; Niklas Wagner)

Multi-period value-at-risk (VaR) forecasts are essential in many financial risk management applications. This paper addresses financial risk prediction for equity markets under long-range dependence. Our empirical study of established equity markets covers daily index observations from January 1975 to December 2007. We document substantial long-range dependence in absolute as well as squared returns, indicating a significant influence of long-memory effects on volatility. We account for long memory in multi-period value-at-risk forecasts via a scaling-based modification of the GARCH(1,1) forecast. We study the accuracy of VaR predictions for five-, ten-, 20- and 60-day-ahead forecast horizons. As a benchmark model we use a fully parametric GARCH setting whose multi-day volatility is calculated by the Drost and Nijman (1993) formula. Moreover, the benchmark model uses the Cornish-Fisher expansion to calculate quantiles of the innovations distribution. Our results show that the scaling-based GARCH-LM technique can improve VaR forecasting results. The scaling-based GARCH-LM method significantly outperforms the benchmark model for forecast horizons of five and ten days. This outperformance is only in part due to higher levels of our risk forecasts. Even after controlling for the unconditional VaR levels of both approaches, our approach delivers results that are not dominated by the benchmark approach. In all, our results confirm that poorly estimated average capital levels cannot be compensated for by adequate adjustment to time-varying market conditions.

4 Applying the Detrended Cross-Correlation Analysis to Six Major Stock Index Returns (Vahid Saadi, Management and Economics, Sharif University of Technology, [email protected]; Milad Nozari; Shiva Zamani)
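As a rough illustration of two of the tools mentioned above, the following is a minimal sketch of an EWMA volatility estimate and a naive square-root-of-time multi-period VaR. The decay parameter lam = 0.94, the normal quantile, and the simulated returns are illustrative assumptions; the scaling-based GARCH-LM forecast and the Cornish-Fisher quantile correction described in the abstract are not implemented here.

```python
import numpy as np
from scipy.stats import norm

def ewma_volatility(returns, lam=0.94):
    """EWMA variance recursion: sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}^2.
    lam = 0.94 is the common RiskMetrics choice, used here only as an illustrative default."""
    returns = np.asarray(returns, dtype=float)
    sigma2 = np.empty_like(returns)
    sigma2[0] = returns[0] ** 2  # simple initialisation with the first squared return
    for t in range(1, len(returns)):
        sigma2[t] = lam * sigma2[t - 1] + (1 - lam) * returns[t - 1] ** 2
    return np.sqrt(sigma2)

def multi_period_var(sigma_daily, horizon, alpha=0.01):
    """Naive h-day VaR: square-root-of-time scaling of one-day volatility with a
    normal quantile (not the long-memory scaling or Cornish-Fisher correction)."""
    return -norm.ppf(alpha) * sigma_daily * np.sqrt(horizon)

# Tiny usage example on simulated daily returns (illustrative only).
rng = np.random.default_rng(0)
r = rng.normal(0.0, 0.01, size=500)
sigma = ewma_volatility(r)
print(multi_period_var(sigma[-1], horizon=10))  # 10-day 99% VaR estimate
```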
Similar references
Supervised Nonlinear Dimensionality Reduction Based on Evolution Strategy
Most classifiers suffer from the curse of dimensionality when classifying high-dimensional image and non-image data. In this paper, we introduce a new supervised nonlinear dimensionality reduction (S-NLDR) algorithm called supervised dimensionality reduction based on evolution strategy (SDRES) for both image and non-image data. The SDRES method uses the power of evolution strategy...
Full text
Distributed Knowledge Discovery with Non Linear Dimensionality Reduction
The results of data mining tasks are usually improved by reducing the dimensionality of the data. This improvement, however, is harder to achieve when the data lie on a non-linear manifold and are distributed across network nodes. Although numerous algorithms for distributed dimensionality reduction have been proposed, all assume that data reside in a linear space. In order to address the non-linear c...
Full text
Linear versus nonlinear dimensionality reduction for banks' credit rating prediction
Dimensionality reduction methods have shown their usefulness for both supervised and unsupervised tasks in a wide range of application domains. Several linear and nonlinear approaches have been proposed in order to derive meaningful low-dimensional representations of high-dimensional data. Among nonlinear algorithms, manifold learning methods, such as isometric feature mapping (Isomap), have rec...
Full text
ENSO dynamics in current climate models: an investigation using nonlinear dimensionality reduction
Linear dimensionality reduction techniques, notably principal component analysis, are widely used in climate data analysis as a means to aid in the interpretation of datasets of high dimensionality. These linear methods may not be appropriate for the analysis of data arising from nonlinear processes occurring in the climate system. Numerous techniques for nonlinear dimensionality reduction have...
Full text
Linear and Nonlinear Dimensionality Reduction in fMRI Data for Picture-Sentence Classification
fMRI data is represented in a space of very high dimensionality. Because of this, classifiers such as SVM and Naive Bayes may overfit this data. Dimensionality reduction methods are intended to extract features from data in a high-dimensional space. Training a classifier on data in a lower dimension may improve the true error of the classifier beyond the performance obtained by training in a ...
Full text
Fast Training of Graph-Based Algorithms for Nonlinear Dimensionality Reduction
Dimensionality reduction algorithms have long been used either for exploratory analysis of a high-dimensional dataset, to reveal structure such as clustering, or as a preprocessing step, by extracting low-dimensional features that are useful for classification or other tasks. Here we focus on dimensionality reduction algorithms where a dataset consisting of N objects is represented...
Full text
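Since the article above and several of the related entries concern Isomap-based nonlinear dimensionality reduction as a preprocessing step for classification, the following is a minimal sketch using scikit-learn's standard Isomap implementation. It only illustrates the general pipeline; the dataset, n_neighbors, n_components, and the k-NN classifier are illustrative assumptions, not the improved Isomap algorithm proposed in the article.

```python
from sklearn.datasets import load_digits
from sklearn.manifold import Isomap
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Load a small image dataset and split it for a classification test.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Reduce the 64-dimensional digit images to a low-dimensional manifold with
# standard Isomap, then classify in the reduced space with k-nearest neighbours.
model = make_pipeline(
    Isomap(n_neighbors=10, n_components=10),
    KNeighborsClassifier(n_neighbors=5),
)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```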